The INSPIRE Lab at IIT Madras is focused on the development of safe and affordable robotic systems for surgical and image-guided interventions and their clinical translation. It is a truly interdisciplinary place with research opportunities across engineering and design, including mechanical, biomedical, computer science, and electrical and electronics engineering.
"It is difficult to say what is impossible, for the dream of yesterday is the hope of today and the reality of tomorrow." - Robert H. Goddard (WPI)
Research
Dr. Patel has worked on the development and clinical/preclinical evaluation of various robotic systems for MRI-guided cancer diagnosis and treatment, and on safety in robot-assisted retinal surgeries using machine learning. INSPIRE Lab will explore similar research problems in X-ray/CT/ultrasound-guided robotic systems and their evaluation in real clinical environments, as well as the development of next-generation technologies for minimally invasive robotic surgery. If this sounds like a place where you would like to be, please contact Dr. Patel for more information.
Assistant Professor, Dept. of Engineering Design, IIT Madras, Chennai
Mukund Shah
MS Candidate, Dept. of Engineering Design, IIT Madras, Chennai
CONTACT US
INSPIRE Lab is always looking for talented undergraduate, graduate, and PhD students. If you are interested in any of the above-mentioned research areas, please contact Dr. Nirav Patel. If you are looking for postdoctoral opportunities, please see the IIT Madras Institute Post-Doctoral Fellowship opportunities.
Please email Dr. Nirav Patel at
Image-guided robotic surgery
MRI-Guided Prostate Biopsy
Dr. Patel worked on the development of a robotic system for targeted transperineal prostate biopsy under direct interventional magnetic resonance imaging (MRI) guidance. The clinically integrated robotic system follows a modular design approach, comprising a surgical navigation application, robot control software, MRI robot controller hardware, and a robotic needle placement manipulator. The system provides enabling technologies for MRI-guided procedures: it can be easily transported and set up to support the clinical workflow of interventional procedures, and it is readily extensible and reconfigurable to other clinical applications. The system has been evaluated in phantom studies as well as 30 patient trials at Brigham and Women's Hospital.
Dr. Patel developed an integrated robotic system to precisely ablate deep brain tumors under real-time MRI guidance using needle-based therapeutic ultrasound (NBTU). This system is clinically optimized to perform and monitor an in-bore ablation procedure. The integrated system comprises an updated robotic manipulator, the NBTU ablation system, surgical planning and navigation applications, and a custom-developed application for real-time thermal dosage monitoring. The system has been evaluated in phantom studies and in 10 porcine trials, demonstrating targeting and real-time monitoring of the ablation region via magnetic resonance thermal imaging (MRTI).
This project aims to develop a body-mounted robotic system for MRI-guided shoulder arthrography in pediatric patients. The robotic manipulator is optimized to be accurate yet light enough to perform the contrast agent injection and joint examination imaging inside the MRI bore. It provides 4 degrees of freedom (DOF) to align the needle guide to an accurate insertion trajectory. In conventional shoulder arthrography, the contrast agent is injected under fluoroscope guidance, resulting in radiation exposure that should be avoided in pediatric patients. Moreover, MRI images are typically acquired for examination after the contrast agent injection, resulting in a two-stage procedure. This system allows clinicians to perform both contrast agent injection and joint examination under MRI guidance, completely eliminating the radiation exposure of fluoroscope guidance and the patient transfer from the X-ray/CT room to the MRI suite.
Purpose: This paper presents the development of a body-mounted robotic assistant for magnetic resonance imaging (MRI)-guided low back pain injection. Our goal was to eliminate the radiation exposure of traditional X-ray guided procedures while enabling the exquisite image quality available under MRI. The robot is designed with a compact and lightweight profile that can be mounted directly on the patient's lower back via straps, thus minimizing the effect of patient motion by moving along with the patient. The robot was built with MR-conditional materials and actuated with piezoelectric motors so it can operate inside the MRI scanner bore during imaging and therefore streamline the clinical workflow by utilizing intraoperative MR images. Methods: The robot is designed with a four-degrees-of-freedom parallel mechanism, stacking two identical Cartesian stages, to align the needle under intraoperative MRI guidance. The system targeting accuracy was first evaluated in free space with an optical tracking system, and further assessed with a phantom study under live MRI guidance. Qualitative imaging quality evaluation was performed on a human volunteer to assess the image quality degradation caused by the robotic assistant. Results: The free-space positioning accuracy study demonstrated a mean tip position error of 0.51 ± 0.27 mm and a mean needle angle error of 0.70 ± 0.38 deg. The MRI-guided phantom study indicated mean errors of 1.70 ± 0.21 mm at the target, 1.53 ± 0.19 mm at the entry point, and 0.66 ± 0.43 deg in needle angle. Qualitative imaging quality evaluation validated that the image degradation caused by the robotic assistant in the lumbar spine anatomy is negligible. Conclusions: The study demonstrates that the proposed body-mounted robotic system is able to perform MRI-guided low back injection in a phantom study with sufficient accuracy and with minimal visible image degradation that should not affect the procedure.
One of the major yet little-recognized challenges in robotic vitreoretinal surgery is the matter of tool forces applied to the sclera. Tissue safety, coordinated tool use, and interactions between tool tip and shaft forces are little studied. The introduction of robotic assistance has further diminished the surgeon's ability to perceive scleral forces. Microsurgical tools capable of measuring such small forces, integrated with robot manipulators, may therefore improve functionality and safety by providing scleral force feedback to the surgeon. In this paper, using a force-sensing tool, we conducted robot-assisted eye manipulation experiments to evaluate the utility of providing scleral force feedback. The work assesses 1) passive audio feedback and 2) active haptic feedback, and evaluates the impact of these feedback modes on scleral forces in excess of a boundary. The results show that in the presence of passive or active feedback, the duration of the experiment increases, while the duration for which scleral forces exceed a safe threshold decreases.
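The safety metric described above, the duration for which scleral forces exceed a safe threshold, can be sketched as a simple monitor over a stream of force readings. This is an illustrative sketch only: the threshold value, units, and function names below are assumptions for demonstration, not values from the paper.

```python
# Minimal sketch of a scleral-force safety monitor. Assumes a force-sensing
# tool streaming tool-to-sclera force magnitudes; the threshold below is a
# hypothetical safe limit in millinewtons, not a value from the study.

SAFE_THRESHOLD_MN = 120.0  # hypothetical safe scleral force limit (mN)

def monitor_scleral_force(force_samples_mn, threshold=SAFE_THRESHOLD_MN):
    """Return the indices of samples exceeding the safe threshold and the
    fraction of samples spent above it (a proxy for unsafe duration when
    samples are uniformly spaced in time)."""
    if not force_samples_mn:
        return [], 0.0
    over = [i for i, f in enumerate(force_samples_mn) if f > threshold]
    return over, len(over) / len(force_samples_mn)

# Example: a short stream of force readings; samples 2 and 3 exceed the limit,
# which would trigger the audio or haptic alert in the feedback schemes above.
samples = [40.0, 90.0, 130.0, 150.0, 80.0]
over_indices, fraction_over = monitor_scleral_force(samples)
```

In the experiments described above, crossing the threshold would drive either the passive audio cue or the active haptic response; the monitor itself is the same in both cases.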
OBJECTIVE: Robot-assisted retinal microsurgery provides several benefits, including improved manipulation precision. The assistance provided to surgeons by current robotic frameworks is, however, passive support, e.g., damping hand tremor; intelligent assistance and active guidance are lacking in existing robotic frameworks. In this paper, an active interventional robotic system (AIRS) is presented to increase operation safety by actively intervening in the operation to avoid exertion of excessive forces on the sclera. METHOD(S): AIRS consists of four components: 1) the steady-hand eye robot as the robotic module; 2) a sensorized tool to measure tool-to-sclera forces; 3) a recurrent neural network to predict the occurrence of undesired events based on a short history of time-series sensor measurements; and 4) a variable admittance controller to command the robot away from undesired instances. RESULT(S): The performance of the proposed framework has been validated through a set of user studies involving 14 participants (including 4 surgeons). The users were asked to perform a vessel-following task on an eyeball phantom with the assistance of AIRS as well as two benchmark approaches, i.e., auditory feedback (AF) and real-time force feedback (RF). Statistical analysis shows that AIRS results in a significant reduction of the proportion of undesired instances to about 2.5%, compared to 38.4% and 26.2% using the AF and RF approaches, respectively. CONCLUSION(S): AIRS can effectively predict excessive-force instances and augment the performance of the user to avoid undesired events during robot-assisted microsurgical tasks. SIGNIFICANCE: The proposed system may be extended to other fields of microsurgery and may potentially reduce tissue injury. INSPIRE Lab will be exploring similar research areas in laparoscopic and microsurgery applications.
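The idea of a variable admittance controller driven by a predicted risk score can be sketched as follows. This is a minimal illustration under assumed gains and an assumed linear blending law, not the controller published with AIRS: in an admittance scheme, the commanded velocity follows the user's applied force, and here the admittance gain shrinks as the predicted probability of an undesired (excessive-force) event grows, so the robot stiffens near risky instances.

```python
# Illustrative variable admittance law: v = k(risk) * F, where k decreases
# as the predicted risk of an undesired event (e.g., from a recurrent
# network) approaches 1. All gain values here are hypothetical.

BASE_ADMITTANCE = 0.02   # m/s per N when predicted risk is zero (assumed)
MIN_ADMITTANCE = 0.001   # floor so the robot never locks up entirely (assumed)

def admittance_gain(risk, base=BASE_ADMITTANCE, floor=MIN_ADMITTANCE):
    """Linearly shrink the admittance from `base` toward `floor` as the
    predicted risk goes from 0 to 1; risk is clamped to [0, 1]."""
    risk = min(max(risk, 0.0), 1.0)
    return floor + (base - floor) * (1.0 - risk)

def commanded_velocity(user_force_n, risk):
    """Admittance control: map the user's applied force (N) to a velocity
    command (m/s), scaled down when risk is high."""
    return admittance_gain(risk) * user_force_n

# With zero predicted risk the robot is fully compliant; at near-certain
# risk the same applied force produces a much smaller commanded velocity,
# steering the tool away from the excessive-force regime.
v_safe = commanded_velocity(1.0, risk=0.0)
v_risky = commanded_velocity(1.0, risk=1.0)
```

The linear blend is just one choice; a sigmoid or exponential schedule would give the same qualitative behavior of stiffening the robot as predicted risk rises.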