Biophysics-inspired AI for dynamic perfusion quantification in living tissues - overview


Our solution aims to provide novel biophysics-inspired AI tools for "moving from a still image to the flow of information" encoded in dynamic behaviour, such as uptake and release rates. This information will be extracted and processed, and the outcomes will be made available to the surgeon during surgery through an Augmented Reality view overlaid on the real-time feed from the Clinical Fluorescence Imaging System (CFIS). This Augmented Reality for Surgeons (ARS) decision-support system will augment human judgement and make it more precise: it will enable the surgeon to distinguish between various anatomical structures and to detect malignant tissues and their margins in near real time, with accuracy well beyond what is possible today. In addition, by closing the "loop between pathology and surgery", ARS will link surgical outcomes to the pathology results (e.g. healthy tissue, malignant tumour, or benign tumour) and use them to continuously refine its tissue classifier over time. The resulting classifier can then support the individual decisions of any surgeon, including those with less experience, giving every surgeon direct access to a collective expert knowledge base. Taken together, integrating the ARS system into the colorectal surgery-pathology workflow will positively disrupt outcomes.
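To make "uptake and release rates" concrete, the sketch below fits a release (wash-out) rate from a fluorescence time-intensity curve. The bi-exponential wash-in/wash-out model and all parameter values are illustrative assumptions, not the project's actual biophysical model:

```python
import numpy as np

# Hypothetical bi-exponential wash-in/wash-out model for a fluorescence
# time-intensity curve (an assumption for illustration; the project's
# actual biophysical model is not specified in this overview).
def intensity(t, amp, k_in, k_out):
    return amp * (np.exp(-k_out * t) - np.exp(-k_in * t))

t = np.linspace(0.0, 120.0, 241)  # seconds after dye injection
rng = np.random.default_rng(0)
signal = intensity(t, amp=2.0, k_in=0.5, k_out=0.03) \
         + rng.normal(0.0, 0.005, t.size)  # measurement noise

# After the fast wash-in phase, the curve is dominated by the slow
# wash-out term, so a log-linear fit on the tail recovers k_out.
tail = t > 60.0
slope, _ = np.polyfit(t[tail],
                      np.log(np.clip(signal[tail], 1e-6, None)), 1)
k_out_est = -slope  # estimated release rate, ~0.03 per second
```

In a real pipeline, such a fit would run per pixel (or per region) of the CFIS video, producing rate maps that the AR view can overlay on the live feed.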


Required skills:

1) Python (OpenCV, visualisation), C++, and video/image processing (optical flow, object tracking),

2) some experience connecting hardware and efficiently using its associated APIs/SDKs.
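As a minimal illustration of the optical-flow skill listed above, the sketch below estimates a single translation between two frames with a Lucas-Kanade-style least-squares solve in pure NumPy (in practice OpenCV's dense or pyramidal flow functions would be used; the synthetic blob frames are an assumption for the demo):

```python
import numpy as np

# Two synthetic 64x64 frames: a smooth Gaussian blob that moves
# +1 pixel in x between frames (stand-in for a tracked feature).
yy, xx = np.mgrid[0:64, 0:64].astype(float)

def blob(cx, cy, sigma=6.0):
    return np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma ** 2))

frame1 = blob(30, 32)
frame2 = blob(31, 32)  # true motion: (vx, vy) = (1, 0)

# Brightness-constancy: Ix*vx + Iy*vy + It = 0, solved in the
# least-squares sense over the whole frame (single global motion).
Iy, Ix = np.gradient(frame1)          # spatial gradients
It = frame2 - frame1                  # temporal gradient
A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
              [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
vx, vy = np.linalg.solve(A, b)        # estimated flow, close to (1, 0)
```

Real dense flow over a perfusion video would instead use e.g. `cv2.calcOpticalFlowFarneback`, but the least-squares structure above is the core idea being asked for.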