Effect of Simulation Learning on Radiology Resident Skills

3680 words (15 pages) Dissertation

17th Dec 2019


What is the ‘utility’ of simulation training in the acquisition of interventional skills by radiology residents? 

Introduction and background 

The acquisition of interventional skills by radiology residents seems to vary greatly both between and within different residency programs (Gould, 2010). This is often due to the varied availability of trainers’ time or inconsistencies in the availability of ‘protected’ clinical sessions in which to teach such skills (Gaba, 2004). This variable learning experience might leave some residents feeling less confident in performing interventional procedures by the end of their training (Gould, 2010). It can also have an impact on residents’ future ‘subspecialty’ career choices.

Over the past decade, there has been a drive towards ‘standardising’ learning and assessments both at the undergraduate and postgraduate medical education level in an effort to ensure similar experiences between trainees (Gaba, 2004). Simulation teaching is now widely used both at the undergraduate and postgraduate medical-education level to teach practical and procedural skills to clinicians (Aggarwal et al., 2010).

The aim of this study is to review the existing literature in an effort to ascertain whether ‘simulation training’ can be used as an effective method for radiology residents to learn interventional skills, and what impact this might have on patient care and the wider organisation.

A literature search was performed using ‘OVID® Medline Epub Ahead of Print, In-Process & Other Non-Indexed Citations, Daily, and Versions(R), 1946 to present’. The following advanced search terms were used: ‘radiology resident’ or ‘radiology trainee’ and ‘simulation training’ along with ‘interventional skills’ or ‘interventional procedures’. The advanced search yielded 63 articles. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement was used as a reference for identifying and including the relevant studies for this review (Moher et al., 2009). Of the 63 studies, 4 duplicates were removed (Figure 1). The titles of the remaining 59 studies were screened for relevance (Figure 1). Thirty-two articles were excluded by title and abstract. The remaining 18 ‘eligible’ articles were screened against the inclusion criteria, which resulted in 9 articles being included in this review (Figure 2).

‘Thematic analysis’ was used to analyse and organise the studies into different thematic categories (Castleberry and Nolen, 2018). Three main ‘themes’ emerged from the data: the ‘utility’ of simulation training for residents, for patients and for organisations.

Figure 2: Inclusion and exclusion criteria for study identification


Main Body

The ‘utility’ for residents

Several authors recognise that simulation-training provides a ‘safe’ environment whereby trainees can make and learn from mistakes in a ‘controlled’ way at no detriment to the patient (Andreatta et al., 2011; Narra et al., 2009). In addition, trainers have the ability to provide feedback ‘freely’ without being concerned with unduly worrying the patient, which often proves to be the limiting factor in providing ‘live’ feedback (Johnson et al., 2012).

Klass et al. (2008) also agree that one of the greatest advantages of simulation training is the ability to provide ‘live’ feedback to trainees, who can immediately correct a mistake such as the incorrect placement of equipment. However, simulation cannot as yet reproduce ‘real-life’ complications, such as vessel dissection during an endovascular procedure, and so cannot give residents the opportunity to train in managing such complications (Klass et al., 2008).

Furthermore, several studies note significant improvement in trainee confidence in performing the procedure both ‘under’ supervision and independently after the use of simulation training (Andreatta et al., 2011; Johnson et al., 2012; Klass et al., 2008).

One of the most frequently reported benefits of simulation training is the ability for trainees to go back and ‘deconstruct’ their performance, learn from their mistakes and see which areas they need to ‘focus’ on (Johnson et al., 2012; Klass et al., 2008; Chaer et al., 2006). Some simulator models can provide very detailed and sophisticated feedback, such as ‘wire stability’, ‘overshooting’ and ‘unnecessary motion’, that the naked eye of a trainer would not be able to ‘pick up’ during traditional teaching methods (Chaer et al., 2006). Trainees can greatly benefit from such feedback, as they know exactly what they need to focus on in order to improve (Chaer et al., 2006).


Moreover, Mendiratta-Lala et al. (2010) point out that the traditional apprenticeship model does not offer a ‘robust’ method of assessing performance and suffers from ‘evaluation biases’. By contrast, simulation training was found to significantly improve the ‘retention of knowledge’ and ‘proficiency and expertise in skills’ (Mendiratta-Lala et al., 2010). In addition, they argue that simulation can be used to ‘serve’ the learning needs of trainees who might be learning at different rates (Mendiratta-Lala et al., 2010). However, these results should be interpreted with caution, as the internal validity of the study is limited by the absence of a ‘control group’.

Johnson et al. (2012) propose that incorporating ‘performance metrics’ into simulation training can provide an effective way to assess performance. For example, metrics such as the ‘number of “no-go” areas touched’ and ‘time in no-go area’ can be used as indicators for deciding ‘when’ a trainee is ready to practise a procedure independently and when a trainee requires more support and training in a particular area (Johnson et al., 2012).
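As a purely illustrative sketch of how such ‘performance metrics’ might be combined into a readiness check, one could imagine something like the following. The metric names and thresholds here are hypothetical assumptions for illustration, not values taken from Johnson et al. (2012):

```python
# Hypothetical sketch: combining simulator performance metrics into a
# simple readiness check. The metric names and threshold values are
# illustrative assumptions, not figures from Johnson et al. (2012).

def is_ready_for_independent_practice(metrics, thresholds):
    """Return True only if every metric is within its threshold."""
    return all(metrics[name] <= limit for name, limit in thresholds.items())

thresholds = {
    "no_go_areas_touched": 0,        # e.g. no contact with critical structures
    "seconds_in_no_go_area": 0.0,
    "procedure_time_seconds": 600.0,
}

attempt = {
    "no_go_areas_touched": 1,
    "seconds_in_no_go_area": 2.5,
    "procedure_time_seconds": 480.0,
}

# This attempt touched a no-go area, so the check fails.
print(is_ready_for_independent_practice(attempt, thresholds))  # False
```

A real simulator would record many more metrics, and the pass/fail thresholds would need to be validated against clinical outcomes rather than chosen arbitrarily as here.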

A study comparing radiology-resident and medical-student learning of vessel cannulation found no significant difference in successful cannulation time between the two groups after the use of a simulator (Narra et al., 2009). The authors argue that simulation learning could prove extremely useful in the first few years of residency training and could overcome the challenge, faced by most residents, of not getting enough ‘hands-on’ experience (Narra et al., 2009). However, the generalisability of these results might be limited, as the study was based in only one institution and used one type of ‘silicone’ vessel model. In addition, the authors used improvement in cannulation time as their metric of procedural ‘success’. This may limit the external validity of the study, as this metric might not be applicable to a clinical scenario where different vessels have different ‘elasticity’ and different levels of difficulty in cannulation and, thus, time might not be the most representative metric of ‘success’ (Narra et al., 2009).

Glaiberman et al. (2008) found that procedural time and fluoroscopic-exposure time halved after 12 months of simulator use. However, there were no data on complication rates or on the technical quality of the procedure. In addition, there is a risk that trainees ‘learn’ the procedure on the simulator through repeated use of the same model, in which case simulator performance would no longer be a representative measure of their technical-skills learning (Glaiberman et al., 2008).

The ‘utility’ for patients

As noted by Aggarwal et al. (2010), simulation training not only aids technical skill performance but also encourages teamwork skills and a collaborative training environment. It also reinforces communication skills amongst participants, which can have both a direct and an indirect positive impact on patient safety (Aggarwal et al., 2010).

Several studies report a significant reduction in procedural time after simulation training, with a consequent reduction in the radiation dose to both the patient and staff members whenever ionising radiation was used during a procedure (Aggarwal et al., 2010; Narra et al., 2009; Glaiberman et al., 2008; Klass et al., 2008). On the other hand, there are some concerns that simulation training might encourage ‘negative training’ amongst residents if they become focused on reducing procedural time at the expense of making more mistakes during the procedure (Klass et al., 2008).
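The dose argument rests on simple arithmetic: at a roughly constant dose rate, fluoroscopic dose scales linearly with screening time, so halving screening time roughly halves the dose. A minimal sketch, in which the 20 mGy/min dose rate is an assumed illustrative figure rather than a value from the cited studies:

```python
# Illustrative only: the dose-rate figure below is an assumption for the
# sake of the example, not a value reported in the cited studies.

def fluoro_dose_mgy(dose_rate_mgy_per_min: float, screening_time_min: float) -> float:
    """Approximate fluoroscopic dose as dose rate multiplied by screening time."""
    return dose_rate_mgy_per_min * screening_time_min

before = fluoro_dose_mgy(20.0, 10.0)  # 10 minutes of screening
after = fluoro_dose_mgy(20.0, 5.0)    # screening time halved, as in Glaiberman et al.
print(before, after)  # 200.0 100.0
```

In practice dose rates vary with patient habitus and imaging mode, so the relationship is only approximately linear, but the direction of the effect holds: less screening time means less dose.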

In addition, reductions in patient-reported anxiety levels have been reported after the use of simulation training (Aggarwal et al., 2010). This might be because trainees have adequate time to practise the steps of a procedure on a simulator before performing it on the patient, and may therefore appear more confident in front of the patient (Aggarwal et al., 2010).

Whilst simulation training has been shown to increase trainee confidence and familiarity with the use of equipment, it has not yet been proven that it can significantly reduce intra-procedural mistakes and complications (Klass et al., 2008). Glaiberman et al. (2008) argue that simulation could offer clinicians the opportunity to ‘rehearse’ a procedure before performing the case, by acquiring the cross-sectional images from PACS and applying them to the simulator. This could potentially enable clinicians to perform the procedure more safely on the day; however, this has not been investigated to date (Glaiberman et al., 2008).

The ‘utility’ for organisations


As Johnson et al. (2012) point out, the cost of purchasing and maintaining a simulator, as well as the training of staff, are the main hurdles to the adoption of simulation training in clinical practice. Glaiberman et al. (2008) agree with this point of view and argue that clinicians need to set time aside to supervise residents’ initial simulation training in order to ensure that the correct skills are being learned.

On the other hand, Gaba (2004) highlights that simulation training has a lot to offer, as it can act as a platform for continuous training development and a standardised tool for assessing individual performance, and could be used for recruitment purposes. Mendiratta-Lala et al. (2010) agree that simulation can be used as a way of providing a more ‘uniform’, standardised type of training to radiology residents.

Moreover, Mendiratta-Lala et al. (2010) argue that simulation can better prepare trainees for dealing with real-life emergencies, as with ‘repeated’ practice they would have gained the necessary confidence and ‘proficiency’ to deal with the complexity required by emergency cases.

As previously mentioned, several studies have reported reductions in fluoroscopic time with the use of simulation training (Aggarwal et al., 2010; Narra et al., 2009; Glaiberman et al., 2008; Klass et al., 2008). In everyday clinical practice, this can translate to less time spent per procedure and the ability to book more cases per clinical session, which results in remuneration benefits for hospitals and shorter waiting times for patients (Klass et al., 2008). In addition, Glaiberman et al. (2008) report more efficient use of resources by residents following simulation training, which could result in organisational ‘savings’.

Furthermore, as noted by Klass et al. (2008), simulation training can provide a continuous stream of cases and material for trainees to practise on, without trainers and trainees having the added pressure of relying on the everyday ‘flow’ of cases. On the other hand, there are some concerns that the simulation-training environment might be too ‘controlled’ and not representative of the complex clinical environment (Klass et al., 2008).

Several studies agree that one of the biggest challenges of simulation training is finding a way to ensure the transfer of simulation skills to everyday clinical cases (Klass et al., 2008; Mendiratta-Lala et al., 2010; Gould, 2010). Another question that remains unanswered is how ‘best’ to integrate simulation training into traditional radiology training (Klass et al., 2008).

Conclusion

There is a unifying overall belief within the literature that simulation training can not only have a positive impact on the acquisition of interventional skills amongst radiology residents but also offer benefits to the organisation and patient care.

There is good evidence that simulation training improves residents’ familiarity with the use of equipment and confidence in performing a procedure. There is also strong evidence that fluoroscopic time and overall procedural time reduce after the use of simulation training by residents. It can be inferred that this is because the technical skills of radiology trainees have improved following simulation training; however, it is difficult to prove that technical skills have improved without examining residents’ complication rates before and after simulation training, which has not yet been done in the literature.

Similarly, there is a lack of evidence regarding the direct effect simulation training can have on patient outcomes, such as procedural success and complication rates. Prospective studies are needed to compare clinical outcomes under simulation training versus the traditional training model for radiology trainees.

There is retrospective evidence showing that simulation training has led to reductions in procedural times, in contrast-agent use in the case of angiography and in the use of the wrong equipment, all of which can translate to cost savings for the organisation. However, there is no level I evidence or cost-effectiveness analysis to date showing that the adoption of simulation training can lead to significant cost savings for the organisation.

The cohort of studies included in this review, however, is not exhaustive of all the procedures learned by radiology trainees in daily practice. The fact that simulation training is still under development and does not currently cover all the procedures learnt by trainees can limit the generalisability of this review’s results.

Furthermore, simulation training cannot always replicate the anatomical variations of cases, unlike traditional training, whereby the trainee learns on cases of different complexity. This might limit the generalisability of the findings to the daily dynamic clinical environment.

The hope is that in the near future, as simulator-design becomes more sophisticated, simulators will be able to imitate complex cases and simulate complications of procedures and allow trainees to practice in a more ‘realistic’ environment that more closely represents clinical practice.

Future prospective studies are needed to establish whether simulation training provides better interventional-skill training in comparison to traditional training. In addition, a cost-effectiveness analysis would be beneficial in order to ascertain whether simulation training can offer a more cost-effective training model in comparison to traditional teaching methods.


In the not-so-distant future, simulation training may well form part of residents’ training curriculum and certification, in an effort to offer a more objective assessment of their procedural skills than the more traditional ‘block’ in interventional radiology, where a resident might not be able to get much hands-on experience.

At my current institution, there is no simulator at present but our ‘sister’ teaching hospital in the region owns one, which I was able to use recently during an interventional radiology course that was organised for radiology residents.

I am planning to present the literature review results to the Training Programme Director and Clinical Director of my institution’s radiology department and discuss with them the potential of purchasing or renting a simulator for radiology residents. If this is not financially feasible, I will suggest organising some ‘study days’ where residents from my institution could go to the ‘sister’ teaching hospital in our region and practise interventional skills on its simulator. I will also attempt to encourage our interventional radiology trainers to attend the ‘sister’ institution’s training facility in an effort to assess and evaluate the simulator and offer guidance and feedback to trainees regarding its use.

References

1. Gould, D. (2010) ‘Using simulation for interventional radiology training’, British Journal of Radiology, 83(991), pp. 546–553. doi: 10.1259/bjr/33259594.
2. Gaba, D. M. (2004) ‘The future vision of simulation in health care’, Quality and Safety in Health Care, 13(Suppl 1), pp. 2–10. doi: 10.1136/qshc.2004.009878.
3. Aggarwal, R. et al. (2010) ‘Training and simulation for patient safety’, Quality and Safety in Health Care, 19(Suppl 2), pp. i34–i43. doi: 10.1136/qshc.2009.038562.
4. Moher, D. et al. (2009) ‘Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement’, Journal of Clinical Epidemiology, 62(10), pp. 1006–1012. doi: 10.1016/j.jclinepi.2009.06.005.
5. Castleberry, A. and Nolen, A. (2018) ‘Thematic analysis of qualitative research data: Is it as easy as it sounds?’, Currents in Pharmacy Teaching and Learning, 10(6), pp. 807–815. doi: 10.1016/j.cptl.2018.03.019.
6. Andreatta, P. et al. (2011) ‘Simulation-based training improves applied clinical placement of ultrasound-guided PICCs’, Supportive Care in Cancer, 19(4), pp. 539–543. doi: 10.1007/s00520-010-0849-2.
7. Narra, P. et al. (2009) ‘Videoscopic Phantom-based Angiographic Simulation: Effect of Brief Angiographic Simulator Practice on Vessel Cannulation Times’, Journal of Vascular and Interventional Radiology.
8. Johnson, S. J. et al. (2012) ‘Virtual reality, ultrasound-guided liver biopsy simulator: Development and performance discrimination’, British Journal of Radiology, 85(1013), pp. 555–561. doi: 10.1259/bjr/47436030.
9. Klass, D. et al. (2008) ‘Training on a vascular interventional simulator: An observational study’, European Radiology, 18(12), pp. 2874–2878. doi: 10.1007/s00330-008-1090-y.
10. Chaer, R. A. et al. (2006) ‘Simulation improves resident performance in catheter-based intervention: Results of a randomized, controlled study’, Annals of Surgery, 244(3), pp. 343–349. doi: 10.1097/01.sla.0000234932.88487.75.
11. Mendiratta-Lala, M. et al. (2010) ‘The Use of a Simulation Center to Improve Resident Proficiency in Performing Ultrasound-Guided Procedures’, Academic Radiology, 17(4), pp. 535–540. doi: 10.1016/j.acra.2009.11.010.
12. Glaiberman, C. B. et al. (2008) ‘Simulation in Training: One-year Experience Using an Efficiency Index to Assess Interventional Radiology Fellow Training Status’, Journal of Vascular and Interventional Radiology, 19(9), pp. 1366–1371. doi: 10.1016/j.jvir.2008.05.021.
