Computer Vision News - May 2021

Medical Imaging Projects

Prostate MRI and Ultrasound Registration

A newly developed module for the fusion of MRI and ultrasound scans of the prostate offers new capabilities for tracking during various urological procedures. The new tool is based on both deep learning and classical methods, together with an understanding of the anatomical region of the prostate.

Ultrasound can be used as an intra-operative, real-time modality; however, important information cannot be seen, since certain regions do not interact with the ultrasound's sound waves. MRI, on the other hand, shows a wide range of organs and pathologies, but it is very difficult to acquire in real time. Performing real-time registration and overlaying the information from the MRI scan on top of the ultrasound draws on the benefits of both: all the important information can be shown in real time, in the correct location, providing the surgeon with valuable information during surgery in the region of the prostate.

One very useful example is biopsy. Certain tumors cannot be seen in ultrasound but are easily detectable in a standard MRI scan, and the physician needs to guide a needle to the correct place. By overlaying the tumor detected in the MRI scan on top of the ultrasound display, which correctly depicts the current location of the organs, it becomes clear where the biopsy needle should aim.

Other regions that are critical to detect in certain procedures include the neurovascular bundle and the sphincter. It is important to avoid damaging them, as damage can lead to undesired complications during and after surgery. Reducing procedural risk to patients is crucially important, as is minimizing hospitalization time, costs and the need for higher surgical expertise.

The new tool can also be used in robotic prostate surgeries. Here too, MRI information is not available during the procedure, due to the complicated nature of its acquisition.
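The article does not disclose how the registration itself is implemented, so the snippet below is only a minimal sketch of the general idea: a purely classical, mutual-information-based rigid registration of an MRI volume to an intra-operative ultrasound volume using SimpleITK, followed by resampling an MRI-detected tumor mask into ultrasound space for overlay. The file names are hypothetical, and this stands in for, rather than reproduces, the tool's actual combination of deep learning, classical methods and anatomical priors.

```python
# Minimal sketch (assumed setup, not the product's actual pipeline):
# rigidly register an MRI volume onto a live ultrasound volume and bring
# the MRI-detected tumor mask into the ultrasound frame for overlay.
import SimpleITK as sitk

# Ultrasound is the fixed (real-time) image; MRI is the moving image.
us = sitk.ReadImage("prostate_us.nii.gz", sitk.sitkFloat32)      # hypothetical path
mri = sitk.ReadImage("prostate_mri.nii.gz", sitk.sitkFloat32)    # hypothetical path
tumor_mask = sitk.ReadImage("mri_tumor_mask.nii.gz", sitk.sitkUInt8)

# Rough alignment of the two volumes' centers before optimization.
initial = sitk.CenteredTransformInitializer(
    us, mri, sitk.Euler3DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)

# Mutual information copes with the very different intensity
# characteristics of MRI and ultrasound.
reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetMetricSamplingStrategy(reg.RANDOM)
reg.SetMetricSamplingPercentage(0.2)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=200)
reg.SetOptimizerScalesFromPhysicalShift()
reg.SetInitialTransform(initial, inPlace=False)
transform = reg.Execute(us, mri)

# Resample the tumor mask into the ultrasound frame so it can be drawn on
# top of the live display (nearest neighbour keeps the mask binary).
mask_in_us = sitk.Resample(tumor_mask, us, transform,
                           sitk.sitkNearestNeighbor, 0, tumor_mask.GetPixelID())
sitk.WriteImage(mask_in_us, "tumor_mask_in_us_space.nii.gz")
```

In practice a rigid transform is only a starting point for the prostate, which deforms under probe pressure; a deformable step or a learned registration model would normally follow, which is presumably where the deep learning component mentioned in the article comes in.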
