Project Details
Characterising the procedure of human decision-making in cystoscopy detection and diagnosis for the purpose of optimising AI solutions for risk profiling of bladder cancer.
Applicant
Shane O' Sullivan, Ph.D.
Subject Area
Reproductive Medicine, Urology
Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Term
since 2023
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 520382567
Bladder cancer (BCa) is the 10th most common form of cancer worldwide, with an estimated 549,000 new cases and 200,000 deaths in 2018. Cystoscopy is the first and most important step in the diagnosis of BCa; about a million cystoscopies are performed every year in the USA alone. Since its introduction by the German urologist Maximilian Nitze over 130 years ago, the procedure has improved continuously in terms of patient comfort and diagnostic accuracy. Nevertheless, despite established guidelines, cystoscopic findings are diverse and often challenging to classify. The extent of false negatives and false positives in cystoscopy diagnosis is currently unknown. We suspect a degree of under-diagnosis (failure to detect malignant tumors) and over-diagnosis (referring patients for an unnecessary transurethral resection of bladder tumors [TUR-BT] or biopsy under anesthesia), both of which put patients at risk.

Our hypothesis is that urologists possess implicit (tacit) knowledge about potential malignancy that can be made explicit by analyzing their eye-tracking and feedback data during cystoscopy. This implicit and explicit knowledge can then be used to build training datasets for optimizing AI algorithms. There is a lack of literature on: i) human decision-making in cystoscopy detection and diagnosis; ii) standardized documentation and descriptions of bladder lesions; iii) eye-tracking work related to machine learning. To fill these gaps, this project proposes a study that uses an eye-tracker to: 1) track what an expert urologist is viewing whilst performing live cystoscopy during TUR-BTs; and 2) track what independent urology consultants are viewing whilst reviewing DVD recordings of the cystoscopies performed by the expert urologist. To overcome the limitations of eye-tracking alone, the experiments will follow an iterative, multi-stage process with input from expert urologists, combining eye-tracking with voice recording and pinpointing (e.g. urologists use a digital pen to pinpoint a lesion immediately and record a verbalized description of what they see and what it means). While the urologists examine the bladder, eye-tracking will be used to uncover their decision patterns.

Our central goal is to optimize AI solutions that assist urologists in cystoscopy detection and diagnosis by training the AI on datasets of human decision patterns. The two objectives are: a) curate the cystoscopy and experimental data; b) collaborate with urologists to characterize the decision-making procedure. The findings from this project will support our collaborators' consortium in optimizing their AI software, which provides a professional second opinion for cystoscopy in order to predict and reduce recurrence/progression of BCa more accurately, thereby avoiding unnecessary TUR-BTs/biopsies and increasing patient safety. The project also lays the groundwork for developing AI solutions for other endoscopic procedures (e.g. colonoscopy).
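As a purely illustrative complement to objective a), the minimal Python sketch below shows one way in which time-stamped eye-tracking samples, digital-pen pinpoints, and transcribed verbal descriptions might be aligned into records for a training dataset. All names (GazeSample, PinpointAnnotation, build_training_records) and the 0.5-second matching window are hypothetical assumptions for illustration and are not part of the project description.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class GazeSample:
    """One eye-tracker sample, time-aligned to the cystoscopy video."""
    timestamp_s: float   # seconds from the start of the recording
    x: float             # normalized gaze coordinates on the video frame (0-1)
    y: float

@dataclass
class PinpointAnnotation:
    """A lesion pinpointed with the digital pen, plus the spoken description."""
    timestamp_s: float
    x: float
    y: float
    transcript: str      # transcribed verbal description of the finding

@dataclass
class TrainingRecord:
    """A gaze sample paired with the temporally nearest annotation, if any."""
    gaze: GazeSample
    annotation: Optional[PinpointAnnotation]

def build_training_records(
    gaze_samples: List[GazeSample],
    annotations: List[PinpointAnnotation],
    max_gap_s: float = 0.5,  # hypothetical matching window in seconds
) -> List[TrainingRecord]:
    """Pair each gaze sample with the closest annotation within max_gap_s."""
    records = []
    for gaze in gaze_samples:
        nearest = min(
            annotations,
            key=lambda a: abs(a.timestamp_s - gaze.timestamp_s),
            default=None,
        )
        if nearest is not None and abs(nearest.timestamp_s - gaze.timestamp_s) > max_gap_s:
            nearest = None  # no annotation close enough in time
        records.append(TrainingRecord(gaze=gaze, annotation=nearest))
    return records

if __name__ == "__main__":
    # Toy data: two gaze samples and one pinpointed lesion with its description.
    gaze = [GazeSample(12.30, 0.41, 0.62), GazeSample(12.35, 0.42, 0.61)]
    pins = [PinpointAnnotation(12.40, 0.43, 0.60, "papillary lesion, left lateral wall")]
    for record in build_training_records(gaze, pins):
        print(record)
```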
DFG Programme
Research Grants