Developing a needle guidance virtual environment with patient specific data and force feedback
Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review
Proceedings of the 19th International Congress of Computer Assisted Radiology and Surgery (CARS'05). Vol. 1281. Berlin, Germany: Elsevier, 2005. p. 418-423 (International Congress Series).
RIS
TY - GEN
T1 - Developing a needle guidance virtual environment with patient specific data and force feedback
AU - Vidal, F. P.
AU - Chalmers, N.
AU - Gould, D. A.
AU - Healey, A. E.
AU - John, N. W.
PY - 2005/5
Y1 - 2005/5
N2 - We present a simulator for guided needle puncture procedures. Our aim is to provide an effective training tool for students in interventional radiology (IR) using actual patient data and force feedback within an immersive virtual environment (VE). Training of the visual and motor skills required in IR is an apprenticeship which still consists of close supervision using the model: (i) see one, (ii) do one, and (iii) teach one. Training in patients not only has discomfort associated with it, but provides limited access to training scenarios, and makes it difficult to train in a time efficient manner. Currently, the majority of commercial products implementing a medical VE still focus on laparoscopy where eye-hand coordination and sensation are key issues. IR procedures, however, are far more reliant on the sense of touch. Needle guidance using ultrasound or computed tomography (CT) images is also widely used. Both of these are areas that have not been fully addressed by other medical VEs. This paper provides details of how we are developing an effective needle guidance simulator. The project is a multi-disciplinary collaboration involving practising interventional radiologists and computer scientists.
AB - We present a simulator for guided needle puncture procedures. Our aim is to provide an effective training tool for students in interventional radiology (IR) using actual patient data and force feedback within an immersive virtual environment (VE). Training of the visual and motor skills required in IR is an apprenticeship which still consists of close supervision using the model: (i) see one, (ii) do one, and (iii) teach one. Training in patients not only has discomfort associated with it, but provides limited access to training scenarios, and makes it difficult to train in a time efficient manner. Currently, the majority of commercial products implementing a medical VE still focus on laparoscopy where eye-hand coordination and sensation are key issues. IR procedures, however, are far more reliant on the sense of touch. Needle guidance using ultrasound or computed tomography (CT) images is also widely used. Both of these are areas that have not been fully addressed by other medical VEs. This paper provides details of how we are developing an effective needle guidance simulator. The project is a multi-disciplinary collaboration involving practising interventional radiologists and computer scientists.
KW - Interventional radiology
KW - Virtual environments
KW - Needle puncture
KW - Haptics
U2 - 10.1016/j.ics.2005.03.200
DO - 10.1016/j.ics.2005.03.200
M3 - Conference contribution
VL - 1281
T3 - International Congress Series
SP - 418
EP - 423
BT - Proceedings of the 19th International Congress of Computer Assisted Radiology and Surgery (CARS'05)
PB - Elsevier
CY - Berlin, Germany
T2 - 19th International Congress of Computer Assisted Radiology and Surgery
Y2 - 22 June 2005 through 25 June 2005
ER -