Computer Vision News - October 2021

Étienne Léger recently completed his PhD at Concordia University under the supervision of Marta Kersten-Oertel. His research interest lies in Human-Computer Interaction, evaluating how new methods can potentially improve neurosurgical workflows. His research focused on developing and assessing neurosurgical guidance tools that make use of novel paradigms, methods and hardware to make guidance more intuitive and interactive. He believes that through new hardware integration, neurosurgical guidance can be made more accessible, which can lead to improved patient outcomes.

It is estimated that 13.8 million patients per year require neurosurgical interventions worldwide, be it for cerebrovascular disease, stroke, tumour resection, or epilepsy treatment, among others. These procedures involve navigating through and around complex anatomy in an organ where damage to eloquent healthy tissue must be minimized. Neurosurgery thus has very specific constraints compared to most other domains of surgical care, and these constraints have made it particularly suitable for integrating new technologies. Any new method that has the potential to improve surgical outcomes is worth pursuing, as it can not only save and prolong patients' lives but also increase their quality of life post-treatment.

In his work, Étienne developed novel neurosurgical image-guidance methods using currently available, low-cost off-the-shelf components. In particular, a mobile device (e.g. smartphone or tablet) is integrated into a neuronavigation framework to explore new augmented reality visualization paradigms and novel intuitive interaction methods. The developed system, called MARIN (Mobile Augmented Reality Interactive Neuronavigator), uses augmented reality to make image guidance more intuitive and easier to use. Further, gestures on the mobile device are used to increase interactivity with the neuronavigation system.
These touchscreen interactions help to partially mitigate the loss of accuracy caused by brain shift during surgery. They also give control over the visualization back to the surgeon, enabling them to switch between different visualization methods, be it traditional cut-plane guidance, virtual 3D models extracted from the preoperative scan, or an augmented reality view in which segmented structures are overlaid onto the surgical field in real time. The AR view is also customizable: structures (e.g. vessels, cortex surface, etc.) can be individually added to or removed from the view, and the augmentation can be limited to only

Congrats, Doctor!
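To give a flavour of what an AR overlay like this involves, the sketch below projects points of a segmented preoperative structure into a mobile camera image using a standard pinhole camera model. This is only an illustrative sketch, not MARIN's actual pipeline: the function name, intrinsics, and pose values are all hypothetical, and a real neuronavigation system obtains the pose from patient registration and optical tracking.

```python
import numpy as np

def project_points(points_3d, K, R, t):
    """Project Nx3 points from preoperative-image space into pixel coordinates.

    K: 3x3 camera intrinsics matrix (hypothetical values below).
    R, t: rotation and translation mapping preoperative coordinates into the
    camera frame (in a real system, derived from tracking and registration).
    """
    cam = points_3d @ R.T + t          # rigid transform into the camera frame
    uvw = cam @ K.T                    # apply camera intrinsics
    return uvw[:, :2] / uvw[:, 2:3]    # perspective divide -> pixel coords

# Example: identity pose, one point 2 units straight ahead of the camera
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.zeros(3)
px = project_points(np.array([[0.0, 0.0, 2.0]]), K, R, t)
# a point on the optical axis lands at the principal point (320, 240)
```

Re-rendering such projections every frame as the device moves is what keeps the virtual structures aligned with the surgical field; touchscreen adjustments can then nudge the registration when brain shift degrades the alignment.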
