Simulation of image guided needle puncture: contribution to real-time ultrasound and fluoroscopic rendering, and volume haptic rendering

  • Franck Vidal

Abstract

The potential for the use of computer graphics in medicine has been well established. An important emerging area is the provision of training tools for interventional radiology (IR) procedures. These are minimally invasive, targeted treatments performed using imaging for guidance. Training of the skills required in IR is an apprenticeship that still relies on close supervision, following the model of i) see one, ii) do one, iii) teach one. Simulations of guidewire and catheter insertion for IR are already commercially available. However, training of needle guidance using ultrasound (US), fluoroscopic or computed tomography (CT) images - the first step in approximately half of all IR procedures - has been largely overlooked. We have therefore developed a simulator, called BIGNePSi, to provide training for this commonly performed procedure.

This thesis is devoted to the development of novel techniques to provide an integrated visual-haptic system for the simulation of US-guided needle puncture using patient-specific data with 3D textures and volume haptics. The result is the realization of a cost-effective training tool, using off-the-shelf components (visual displays, haptic devices and workstations), that delivers a high-fidelity training experience.

We demonstrate that the proxy-based haptic rendering method can be extended to use volumetric data, so that the trainee can feel underlying structures, such as ribs and bones, whilst scanning the surface of the body with a virtual US transducer. A volume haptic model is also proposed that provides an effective simulation of needle puncture and can be modulated using actual force measurements. A method of approximating US-like images from CT data sets is also described. We further demonstrate how to exploit today's graphics cards to achieve physically-based simulation of X-ray images using GPU programming and 3D texture hardware, and how to use GPU programming to modify, at interactive frame rates, the content of 3D textures to include the needle shaft and to artificially add a tissue lesion to a specific patient's dataset. This enables the clinician to provide students with a wide variety of training scenarios.
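The two core rendering ideas above can be illustrated with short, hedged sketches. The first is proxy-based volume haptic rendering: instead of colliding with a surface mesh, a proxy point is advanced towards the haptic device position through the scalar (CT) volume and stopped when it would enter material denser than a threshold; the spring between proxy and device is the force the trainee feels. The sketch below is a minimal CPU illustration in NumPy; the function names (density_at, update_proxy, haptic_force), the nearest-neighbour sampling, the threshold test and the constants are illustrative assumptions, not the thesis implementation, which additionally modulates the needle puncture forces with measured data.

```python
import numpy as np

def density_at(volume, p):
    """Nearest-neighbour sample of a density volume (indexed [x, y, z]) at
    voxel position p; a real implementation would use trilinear filtering."""
    i = np.clip(np.round(p).astype(int), 0, np.array(volume.shape) - 1)
    return volume[i[0], i[1], i[2]]

def update_proxy(volume, proxy, device, threshold, step=0.25):
    """Advance the proxy towards the device position in small steps, stopping
    when the next step would enter material denser than `threshold`
    (e.g. bone beneath the skin surface)."""
    direction = device - proxy
    dist = np.linalg.norm(direction)
    if dist < 1e-9:
        return proxy
    direction = direction / dist
    travelled = 0.0
    while travelled < dist:
        candidate = proxy + direction * min(step, dist - travelled)
        if density_at(volume, candidate) > threshold:
            break                       # blocked: proxy rests on the structure
        proxy, travelled = candidate, travelled + step
    return proxy

def haptic_force(proxy, device, stiffness=0.5):
    """Spring coupling between the constrained proxy and the device position;
    this difference is what is fed back to the haptic device."""
    return stiffness * (proxy - device)

# Toy example: a dense slab at x >= 32 blocks the proxy, so pushing the
# device into the slab produces a force pointing back out of it.
vol = np.zeros((64, 64, 64)); vol[32:, :, :] = 1.0
device = np.array([40.0, 32.0, 32.0])
proxy = update_proxy(vol, np.array([10.0, 32.0, 32.0]), device, threshold=0.5)
print(haptic_force(proxy, device))
```

The second is the physically-based X-ray (fluoroscopy) rendering from CT data. The standard physical model for this is the Beer-Lambert attenuation law: the intensity reaching a detector pixel is I = I0 exp(-Σ μ_i Δx), where the attenuation coefficients μ are derived from the CT values. The CPU sketch below assumes a simple parallel-beam geometry along one volume axis; the names and geometry are illustrative only, whereas the simulator evaluates the integral on the GPU over 3D textures.

```python
def xray_image(mu_volume, i0=1.0, dx=1.0):
    """Parallel-beam X-ray image from a volume of linear attenuation
    coefficients mu (indexed [x, y, z]): Beer-Lambert law applied along z.
    CT data in Hounsfield units would first be converted, e.g.
    mu = mu_water * (1 + HU / 1000)."""
    path_integral = mu_volume.sum(axis=2) * dx   # line integral of mu per ray
    return i0 * np.exp(-path_integral)           # transmitted intensity per pixel
```

Because the X-ray integration works directly on 3D textures, writing the needle shaft and synthetic lesions back into those same textures at interactive frame rates is what makes them appear in the simulated fluoroscopy.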

Validation of the simulator is critical to its eventual uptake in a training curriculum, and a project such as this cannot be undertaken without close co-operation with the domain experts. This project has therefore been carried out within a multi-disciplinary collaboration involving practising interventional radiologists and computer scientists of the Collaborators in Radiological Interventional Virtual Environments (CRaIVE) consortium. The cognitive task analysis (CTA) for freehand US-guided biopsy performed by our psychologist partners has been used extensively to guide the design of the simulator. In addition, to ensure that the fidelity of the simulator is at an acceptable level, clinical validation of the system's content has been carried out at each development stage. For further, objective evaluation, questionnaires were developed to assess the features and performance of the simulator; these were distributed to trainees and experts at different workshops. Many suggestions for improvements were collected and subsequently integrated into the simulator.

Details

Original language: English
Awarding Institution
Supervisors/Advisors:
  • Nigel John (Supervisor)
Award date: Jan 2008