Using technology from National Instruments, a team from the School of Mechanical Engineering, University of Leeds, has created a system to measure and dynamically simulate the forces perceived by a surgeon during palpation (examination through touch) in robot-assisted surgery.
Worldwide, over ten million people per year are diagnosed with cancer; more than one in three people develop some form of the disease in their lifetime, and it causes around one in four deaths.
Cancer commonly manifests itself as hard abnormal masses (tumours) embedded within softer tissue (organs). In the case of malignant tumours, early detection and accurate removal increase the patient’s likelihood of survival. Recent years have seen a transfer of surgical procedures from traditional open surgery to Minimally Invasive Surgery (MIS) and, more recently, to robot-assisted laparoscopic surgery. These approaches have shown significant benefits over open surgery, but the lack of direct physical contact has resulted in the loss of haptic (force and touch) feedback, which is required for assessing tissue features through palpation.
So when we began developing a system to measure and dynamically simulate the forces perceived by a surgeon during palpation in robot-assisted surgery, we selected technology from National Instruments.
The simulation system delivers haptic feedback to a user during a virtual MIS palpation exercise, with potential applications including surgical training and further development into a master/slave palpation device. The long-term goal is to overcome this loss of haptic feedback, improving the detection and resection accuracy of tumours through palpation.
To achieve this, we required hardware I/O, third-party hardware interfacing, virtual graphics, and custom data handling and processing. We realised that this functionality could be achieved using NI LabVIEW and NI CompactDAQ.
To simulate the palpation of human tissue, LabVIEW was used to create a virtual environment in which the user is presented with a probe and tissue sample within a patient’s abdomen. Haptic interaction with the virtual environment is provided through a haptic device, and the response forces delivered to the user were determined using Finite Element Analysis (FEA). LabVIEW was also used to control a custom-built physical testing environment, in which silicone tissue models were palpated with a force-sensing probe. These physical tests were primarily performed to validate the FEA data but, in addition, establishing communication between the physical testing environment and the haptic device provided an opportunity to explore the system’s remote palpation capabilities.
In order to measure response forces from silicone tissue models during palpation, we developed a tri-axial Cartesian robotic system capable of moving an instrumented palpation probe relative to the tissue models. Using NI LabVIEW and CompactDAQ we were able to go from concept to solution in a matter of weeks. The system produces response surfaces of tissue models by recording force measurements during palpation at specified in-plane positions.
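For illustration, the C sketch below assembles such a response surface by raster-scanning a grid of in-plane positions and recording a force reading at each point. The motion and measurement routines are hypothetical stubs standing in for the real robot and DAQ calls, and the grid dimensions, spacing and depth are assumed values.

    #include <stdio.h>

    #define NX 20        /* grid points along x (assumed resolution) */
    #define NY 20        /* grid points along y */
    #define STEP_MM 2.0  /* in-plane spacing between palpation points */

    /* Stub for commanding the Cartesian robot to an (x, y) position. */
    static void move_to(double x_mm, double y_mm) { (void)x_mm; (void)y_mm; }

    /* Stub for indenting the probe and reading the force sensor;
       returns a dummy value in place of a real measurement. */
    static double indent_and_measure(double depth_mm) { (void)depth_mm; return 0.0; }

    int main(void)
    {
        double response[NY][NX];      /* response surface: force at each point */
        const double depth_mm = 4.0;  /* indentation depth (assumed) */

        for (int j = 0; j < NY; j++) {
            for (int i = 0; i < NX; i++) {
                move_to(i * STEP_MM, j * STEP_MM);
                response[j][i] = indent_and_measure(depth_mm);
            }
        }

        printf("Scan complete: %d x %d response surface, first point %.2f N\n",
               NX, NY, response[0][0]);
        return 0;
    }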
CompactDAQ provided a fast method of sending signals to our motor controllers and allowed us to record position and force measurements.
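The acquisition itself was programmed in LabVIEW; purely as a text-language illustration, a single on-demand read of the force channel through the NI-DAQmx C API might look like the following, where the channel name, module slot and voltage range are assumptions.

    #include <stdio.h>
    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle task = 0;
        float64 force_v[1];
        int32 read = 0;

        /* Analogue input channel on a CompactDAQ module; the channel name
           and the +/-10 V range are assumptions for this sketch. */
        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "cDAQ1Mod1/ai0", "",
                                 DAQmx_Val_Cfg_Default, -10.0, 10.0,
                                 DAQmx_Val_Volts, NULL);
        DAQmxStartTask(task);

        /* Single on-demand read of the force-sensor voltage. */
        DAQmxReadAnalogF64(task, 1, 10.0, DAQmx_Val_GroupByChannel,
                           force_v, 1, &read, NULL);
        printf("Force channel: %.4f V\n", force_v[0]);

        DAQmxStopTask(task);
        DAQmxClearTask(task);
        return 0;
    }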
The system was programmed to run autonomously using a LabVIEW state machine architecture and allowed parameters such as indentation depth and palpation resolution to be adjusted from the front panel.
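A LabVIEW state machine is essentially a while loop around a case structure driven by a state variable. The C sketch below mirrors that pattern with a simplified set of states; the states and transitions shown are illustrative, not the project’s actual ones.

    #include <stdbool.h>
    #include <stdio.h>

    typedef enum { ST_INIT, ST_MOVE, ST_PALPATE, ST_RECORD, ST_DONE } state_t;

    int main(void)
    {
        state_t state = ST_INIT;
        int point = 0;
        const int n_points = 5;  /* palpation resolution (assumed value) */

        bool running = true;
        while (running) {
            switch (state) {     /* plays the role of LabVIEW's case structure */
            case ST_INIT:
                printf("initialise hardware\n");
                state = ST_MOVE;
                break;
            case ST_MOVE:
                printf("move probe to point %d\n", point);
                state = ST_PALPATE;
                break;
            case ST_PALPATE:
                printf("indent probe to the set depth\n");
                state = ST_RECORD;
                break;
            case ST_RECORD:
                printf("log position and force\n");
                state = (++point < n_points) ? ST_MOVE : ST_DONE;
                break;
            case ST_DONE:
                running = false;
                break;
            }
        }
        return 0;
    }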
To simulate the visual and haptic aspects of palpation during surgery, we created a bespoke Dynamic Link Library (DLL) to interface with the haptic device (PHANToM Omni, SensAble Technologies). This allows two-way communication between LabVIEW and the OpenHaptics API, for example to measure the device’s end-effector position and to programmatically apply forces through the device. The ‘Call Library Function Node’ is used to pass data to and from the DLL, enabling the required parameters for the system to be set up. This meant we could access the device’s functions and build ready-made subVIs that allow a developer to create flexible haptic scenes quickly and easily.
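To give a flavour of what such a DLL exposes, the C sketch below shows a minimal bridge around the OpenHaptics HD API: an exported open/close pair, a force setter and a position getter that LabVIEW can bind to through the Call Library Function Node. The exported function names are hypothetical, and a production DLL would add proper error handling and synchronisation (for example via hdScheduleSynchronous).

    /* haptic_bridge.c - minimal sketch of a LabVIEW-facing DLL around the
       OpenHaptics HD API; exported names are hypothetical. */
    #include <HD/hd.h>

    static HHD      g_device = HD_INVALID_HANDLE;
    static HDdouble g_force[3];     /* force commanded from LabVIEW (N) */
    static HDdouble g_position[3];  /* latest end-effector position (mm) */

    /* Servo-loop callback, run by the OpenHaptics scheduler at ~1 kHz. */
    static HDCallbackCode HDCALLBACK servo(void *data)
    {
        hdBeginFrame(g_device);
        hdGetDoublev(HD_CURRENT_POSITION, g_position);
        hdSetDoublev(HD_CURRENT_FORCE, g_force);
        hdEndFrame(g_device);
        return HD_CALLBACK_CONTINUE;
    }

    __declspec(dllexport) int haptic_open(void)
    {
        g_device = hdInitDevice(HD_DEFAULT_DEVICE);
        if (hdGetError().errorCode != HD_SUCCESS) return -1;
        hdEnable(HD_FORCE_OUTPUT);
        hdStartScheduler();
        hdScheduleAsynchronous(servo, NULL, HD_MAX_SCHEDULER_PRIORITY);
        return 0;
    }

    /* Called from LabVIEW each frame: set the commanded force... */
    __declspec(dllexport) void haptic_set_force(double fx, double fy, double fz)
    {
        g_force[0] = fx; g_force[1] = fy; g_force[2] = fz;
    }

    /* ...and read back the device's end-effector position. */
    __declspec(dllexport) void haptic_get_position(double *x, double *y, double *z)
    {
        *x = g_position[0]; *y = g_position[1]; *z = g_position[2];
    }

    __declspec(dllexport) void haptic_close(void)
    {
        hdStopScheduler();
        hdDisableDevice(g_device);
    }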
Forces are generated by sending pre-determined forcing variables to the DLL from LabVIEW. These are then applied dynamically, using a Gaussian function to generate a force in a haptic control loop that operates at a frequency of 1 kHz. A stiffness function then adjusts the force as a function of the indentation depth, resulting in high-fidelity haptic feedback with smooth forcing during tissue interaction (a simplified sketch of this force model is given below).

LabVIEW’s 3D toolbox was used to create the visual scene, which includes a deformable tissue surface under manipulation by a robotic probe. A height array is programmatically updated according to the position of the end effector to deliver representative visual deformation of the surface. Objects in the final visualisation use virtual reality modelling language (VRML) CAD geometry files to improve the quality of the rendered scene. Coupling the user’s sense of touch with visual feedback in this way mimics real-world physical interaction.
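As one concrete reading of that force model, the C sketch below shapes the reaction force with a radial Gaussian fall-off around the contact point and scales it linearly with indentation depth. The constants and the exact functional form are illustrative assumptions, not the project’s tuned values.

    #include <math.h>
    #include <stdio.h>

    /* Sketch of a Gaussian-shaped reaction force: r is the in-plane distance
       from the contact point (mm) and depth is the indentation depth (mm).
       The constants are illustrative assumptions. */
    static double reaction_force(double r, double depth)
    {
        const double k     = 0.5;  /* stiffness: newtons per mm of indentation */
        const double sigma = 5.0;  /* width of the Gaussian fall-off, mm */

        if (depth <= 0.0) return 0.0;  /* no contact, no force */
        double profile = exp(-(r * r) / (2.0 * sigma * sigma));
        return k * depth * profile;    /* smooth, depth-scaled force */
    }

    int main(void)
    {
        /* Example: force at the contact centre and 5 mm off-centre, 3 mm deep. */
        printf("centre: %.3f N, offset: %.3f N\n",
               reaction_force(0.0, 3.0), reaction_force(5.0, 3.0));
        return 0;
    }

In the real system a function of this kind would run inside the 1 kHz haptic control loop, with the commanded force passed to the device through the DLL.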
Testing the system
A human factors study was then carried out on the final system to assess how well users could detect tumours in virtual tissues. The study was automated within the code, so randomised tissue surfaces could be loaded and other variables controlled programmatically, improving the validity of our statistical results.
Their functionality and ability to integrate easily with third-party technologies meant that NI hardware and LabVIEW not only met the demands of the project but also left room for future system development.
James Chandler, Matthew Dickson, Earle Jamieson, Thomas Mueller, Thomas Reid, Dr Peter Culmer & Dr Rob Hewson - School of Mechanical Engineering, University of Leeds