Pillar 3: Trustworthy Artificial Intelligence for Sensory-rich Surgical Robotics
Surgical robots are widely considered a key enabler for improving and standardising patient outcomes. Autonomy and collaborative human-robot behaviours, however, are mostly absent from current operations (Attanasio et al., 2020).
Existing surgical robotic systems operate under simple leader/follower schemes: they blindly copy the motion of the human operator or follow predetermined paths. There is no autonomy and, for most procedures, robotic positioning precision alone does not determine the ultimate surgical outcome. While robotic surgeons can execute highly precise gestures that rapidly and stably position instruments across many degrees of freedom (DoFs), current robotic systems cannot inform, adapt, or safeguard the interventional process.
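To make the leader/follower scheme concrete, here is a minimal sketch of one control step, assuming a simple incremental position mapping with motion scaling; the function name, parameters, and 3-DoF simplification are illustrative rather than taken from any particular system.

```python
import numpy as np

def follower_update(follower_pos, leader_delta, scale=0.3, clutch_engaged=True):
    """One step of a leader/follower teleoperation loop.

    The follower replays the leader's incremental motion, scaled down
    for precision. No sensing of the surgical scene is involved, which
    is what "blindly copy the motion" means in practice.
    """
    if not clutch_engaged:
        # Clutching lets the operator reposition the leader arm
        # without moving the instrument.
        return follower_pos
    return follower_pos + scale * leader_delta

# Illustrative use: 3-DoF position only; real systems map full 6/7-DoF poses.
follower = np.zeros(3)
for leader_delta in (np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.5, 0.0])):
    follower = follower_update(follower, leader_delta)
print(follower)  # [0.3  0.15 0.  ]
```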
Implementing surgical robot autonomy is a hard problem. This is unsurprising given that, while around 40 hours of training are sufficient for a human to learn to drive, it takes almost 20,000 hours to train a surgeon. Autonomy in surgical robotics can speed up and improve training. It can provide intraoperative decision support to upskill junior surgeons and increase the confidence of experienced ones when undertaking complex cases. Ultimately, it transfers expertise across surgeons, improving interventional outcomes for patients. To move beyond simplistic gesture following (level 0), it is essential to first deliver robot assistance (level 1) before enabling task autonomy (level 2). It is also essential to do so within mixed human-robot surgical teams.
Research into autonomy for surgical robots thus faces distinct challenges compared with industrial and service robotics. Adoption and validation of such technology require the deployment of artificial intelligence (AI) that can be embedded in, and trusted by, existing human surgical teams. Developing rich sensing capabilities and multisensory feedback mechanisms between the robotic system and the human operators is also critical to designing cognitive robotic devices able to work synergistically alongside the human surgical team, right at the patient’s side.
The aim of this pillar is to lay the foundations of a sustainable programme leading to surgical robot autonomy in collaborative human-robot teams. The team will advance the field across four directions feeding into this ambition: trustworthy AI, computational ultrasonography, knowledge extraction from connected medical devices, and sensory-rich human-machine interfaces. Current work includes developing:
- A scientific framework for trustworthy AI, with initial applications to medical images (see the first sketch after this list).
- Adaptive AI-powered ultrasound imaging for actionable surgical guidance.
- Deep learning models supporting heterogeneous and asynchronous information sources (see the second sketch after this list).
- Multisensory feedback mechanisms for surgical decision support.
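As an illustration of the kind of building block the trustworthy AI direction draws on (not the pillar's actual framework), the sketch below estimates per-pixel predictive uncertainty for a medical image segmentation network via Monte Carlo dropout, a standard technique for quantifying model confidence. The network, shapes, and parameters are assumptions made for the example.

```python
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    """Hypothetical stand-in for a medical image segmentation network."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Dropout2d(p=0.2),             # kept stochastic at test time
            nn.Conv2d(16, 2, 3, padding=1),  # 2 classes: background / organ
        )
    def forward(self, x):
        return self.net(x)

@torch.no_grad()
def mc_dropout_predict(model, image, n_samples=20):
    """Monte Carlo dropout: run several stochastic forward passes and
    report the mean softmax (prediction) and its variance (uncertainty)."""
    model.train()  # keep dropout layers active
    probs = torch.stack(
        [torch.softmax(model(image), dim=1) for _ in range(n_samples)])
    return probs.mean(0), probs.var(0)

model = TinySegNet()
image = torch.randn(1, 1, 64, 64)          # dummy single-channel image
mean_prob, uncertainty = mc_dropout_predict(model, image)
print(mean_prob.shape, uncertainty.shape)  # both (1, 2, 64, 64)
```

High-variance pixels flag regions where the model should defer to the surgical team, which is one concrete way trust-related safeguards can surface in practice.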
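The second sketch shows one plausible way to support heterogeneous and asynchronous information sources: each stream gets its own encoder, and a presence mask lets the model fuse whichever modalities have a fresh sample at a given time step. The architecture, stream dimensions, and task head are hypothetical.

```python
import torch
import torch.nn as nn

class LateFusion(nn.Module):
    """Hypothetical fusion model for streams that do not arrive in
    lock-step (e.g. video features, force readings, device status).
    A presence mask zeroes out streams with no fresh sample."""
    def __init__(self, dims=(128, 6, 10), hidden=32, n_classes=4):
        super().__init__()
        self.encoders = nn.ModuleList([nn.Linear(d, hidden) for d in dims])
        self.head = nn.Linear(hidden, n_classes)  # e.g. surgical-phase classes

    def forward(self, streams, mask):
        # streams: list of (batch, dim) tensors; mask: (batch, n_streams)
        feats = torch.stack(
            [enc(x) for enc, x in zip(self.encoders, streams)], dim=1)
        feats = feats * mask.unsqueeze(-1)  # drop stale/missing modalities
        pooled = feats.sum(1) / mask.sum(1, keepdim=True).clamp(min=1)
        return self.head(pooled)

model = LateFusion()
streams = [torch.randn(2, 128), torch.randn(2, 6), torch.randn(2, 10)]
mask = torch.tensor([[1., 1., 0.],   # third stream missing for sample 0
                     [1., 1., 1.]])
print(model(streams, mask).shape)    # (2, 4)
```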
Increasing autonomy in surgical robotics is poised to generate significant impact for patients and healthcare systems: creating a level playing field across surgical capabilities, adding safeguards within interventional procedures, and increasing treatment throughput without sacrificing standards of care.
A list of open-source software projects related to this pillar can be found on the following GitHub repository: https://kcl-bmeis.github.io/TWAISurgRob-OSS-List/
References
- Attanasio, A. et al. (2020) ‘Autonomy in Surgical Robotics’, Annual Review of Control, Robotics, and Autonomous Systems. doi: 10.1146/annurev-control-062420-090543.
- Jacovi, A. et al. (2021) ‘Formalizing Trust in Artificial Intelligence: Prerequisites, Causes and Goals of Human Trust in AI’, in Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency. New York, NY, USA: Association for Computing Machinery (FAccT ’21), pp. 624–635. doi: 10.1145/3442188.3445923.
- Nair, A. A. et al. (2020) ‘Deep Learning to Obtain Simultaneous Image and Segmentation Outputs from a Single Input of Raw Ultrasound Channel Data’, IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, 67(12), pp. 2493–2509. doi: 10.1109/TUFFC.2020.2993779.
- Maier-Hein, L. et al. (2017) ‘Surgical data science for next-generation interventions’, Nature Biomedical Engineering, 1(9), pp. 691–696. doi: 10.1038/s41551-017-0132-7.
- Díez, S. P. et al. (2019) ‘Evaluation of Haptic Feedback on Bimanually Teleoperated Laparoscopy for Endometriosis Surgery’, IEEE Transactions on Biomedical Engineering, 66(5), pp. 1207–1221. doi: 10.1109/TBME.2018.2870542.