Instrument tracking for robotic aided surgery

The success of complex fine-scale surgery depends heavily on the surgeon’s dexterity, level of fatigue, and freedom to manipulate miniature instruments in the target organ. In minimally invasive surgery, internal organs are reached by inserting flexible instruments through several small incisions, which reduces tissue trauma and shortens the patient’s recovery time. In addition, the precision of open surgery can be increased by directing sophisticated instruments, equipped with endoscopic cameras and sensors, to the target organ. Such advances in surgical instrumentation have extended the physical abilities of surgeons, allowing them to operate with better visibility and greater freedom by controlling several robotic arms simultaneously.
Flexible medical instruments for endoscopic surgery

Although robotic surgical arms have significantly increased surgeons’ freedom of operation, one important aspect of the procedure still rests entirely on the surgeon’s own senses: sensory feedback. Stereoscopic vision allows surgeons to estimate distances between instruments and tissue, and the slave robotic arms translate the surgeon’s coarse movements to a fine scale adapted to the size of the tissue. Despite this, tactile feedback is completely missing from robotic surgery aids. To provide such a sense of touch, instruments would need to be equipped with miniature touch sensors, which are currently expensive to integrate.
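The distance estimation that stereoscopic vision provides can be illustrated with the standard pinhole stereo model, where depth is focal length times baseline divided by disparity. The function below is a minimal sketch; the focal length, baseline, and disparity values are illustrative assumptions, not parameters of any particular endoscopic rig.

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_mm: float) -> float:
    """Pinhole stereo model: depth = f * B / d.

    disparity_px -- horizontal pixel shift of a feature between the two views
    focal_px     -- camera focal length expressed in pixels
    baseline_mm  -- distance between the two camera centers, in mm
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# A feature seen with 40 px disparity by a rig with an 800 px focal length
# and a 5 mm baseline lies 100 mm from the cameras under this model.
print(depth_from_disparity(40.0, 800.0, 5.0))  # 100.0
```

Applying this per matched feature yields the instrument-to-tissue distance cues that the stereoscopic view gives the surgeon.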

To obtain information about tissue deformation during surgery, both stiff instruments and soft tissues need to be tracked in real time. In instrument tracking for robotic-aided surgery, the characteristics of the tracked targets are very diverse. Instruments have a well-defined geometry and a predictable position relative to the camera. Tissue, on the other hand, can change its shape due to internal organ movement, and its appearance due to lighting conditions and camera visibility. Once the surgeon applies force to a tissue using an instrument, the tissue deformation needs to be estimated. This measure of deformation is a valuable tool for estimating both organ properties and shape, and an initial step in the construction of visually based sensory feedback for robotic surgery equipment.
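One simple measure of deformation is the average displacement of corresponding mesh vertices between a rest configuration and the deformed configuration observed after force is applied. This is a minimal sketch of that idea; the vertex lists and the choice of plain Euclidean displacement are assumptions for illustration.

```python
import math

def mean_deformation(rest_pts, deformed_pts):
    """Average Euclidean displacement of corresponding mesh vertices.

    rest_pts, deformed_pts -- equal-length sequences of (x, y, z) tuples,
    in the same units (e.g. mm), with vertex-to-vertex correspondence.
    """
    if len(rest_pts) != len(deformed_pts):
        raise ValueError("vertex lists must correspond one-to-one")
    total = sum(math.dist(p0, p1) for p0, p1 in zip(rest_pts, deformed_pts))
    return total / len(rest_pts)

# Two vertices, each pushed 1 mm along z: mean deformation is 1.0 mm.
print(mean_deformation([(0, 0, 0), (1, 0, 0)], [(0, 0, 1), (1, 0, 1)]))
```

Per-vertex (rather than averaged) displacements would localize where on the organ the instrument is pressing.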

To track tissues, a mesh-like model needs to be fitted to the visual data. Salient features in the target tissue are first extracted, and the mesh model is fitted to them. Energy minimization methods can be employed to keep the mesh from taking on abnormal shapes, such as self-intersections and sharp deformations. Salient feature detection depends in part on illumination conditions: tissue with high reflectance tends to obscure salient features and degrades the quality of the mesh fit. Regulating the illumination source according to its proximity to the tissue will be dealt with in other articles. We therefore assume that illumination is regulated such that reflectance poses no serious setback, and that the energy functional defined for mesh fitting can overcome such difficulties.
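The energy-minimization idea can be sketched on a one-dimensional chain of mesh points: a data term pulls each vertex toward its extracted feature, while a Laplacian smoothness term penalizes sharp deformations. This toy gradient descent is an illustration only; the weights `alpha` and `beta`, the learning rate, and the 1-D chain topology are assumptions, not the functional used in any production system.

```python
import numpy as np

def fit_mesh(features, mesh, alpha=1.0, beta=0.5, iters=200, lr=0.05):
    """Fit a 1-D chain of mesh points to feature positions by gradient
    descent on E = alpha * ||mesh - features||^2 + beta * smoothness."""
    mesh = mesh.astype(float).copy()
    for _ in range(iters):
        # Data term gradient: pull each vertex toward its matched feature.
        g_data = mesh - features
        # Smoothness (discrete Laplacian) gradient on interior vertices:
        # penalizes a vertex deviating from the mean of its neighbors.
        g_smooth = np.zeros_like(mesh)
        g_smooth[1:-1] = 2 * mesh[1:-1] - mesh[:-2] - mesh[2:]
        mesh -= lr * (alpha * g_data + beta * g_smooth)
    return mesh

# Noisy feature positions along a chain; the fitted mesh follows them
# while the smoothness term damps the sharpest jumps.
feats = np.array([0.0, 1.1, 1.9, 3.2, 4.0])
print(fit_mesh(feats, np.zeros(5)))
```

The same structure carries over to 2-D triangulated meshes, where the smoothness term runs over mesh edges instead of chain neighbors.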

Since we are dealing with real-time tracking, temporal information needs to be taken into account in the mesh fitting. A mesh (for example, a triangulated one) is composed of a set of points with positional relationships between them. This information can be used to regularize the mesh as it deforms with the tissue across frames. The mesh must disregard the position of the tracked instrument, which should therefore itself be tracked accurately to prevent the mesh fitting from failing. The expected shape of instruments such as robotic tweezers can be modeled beforehand and fitted to the visual data at every frame. The interface between instrument and tissue can only be deduced indirectly, from the quantitative deformation of the mesh after pressure is applied to the tissue.
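A per-frame update combining these two ideas, temporal regularization and instrument masking, might look like the following sketch. The blending weight `gamma` and the boolean occlusion mask are illustrative assumptions: visible vertices move toward the current frame's feature evidence, damped toward the previous frame for temporal smoothness, while vertices occluded by the instrument simply keep their last known position.

```python
import numpy as np

def update_mesh(prev_mesh, frame_features, instrument_mask, gamma=0.3):
    """One temporal update of mesh vertex positions.

    prev_mesh       -- vertex positions from the previous frame
    frame_features  -- feature positions extracted from the current frame
    instrument_mask -- True where a vertex is occluded by the instrument
    gamma           -- temporal damping; higher values trust the past more
    """
    updated = prev_mesh.astype(float).copy()
    visible = ~instrument_mask
    # Visible vertices: blend new evidence with the previous frame.
    updated[visible] = ((1 - gamma) * frame_features[visible]
                        + gamma * prev_mesh[visible])
    # Occluded vertices are left at their previous positions.
    return updated

prev = np.zeros(4)
feats = np.ones(4)
mask = np.array([False, False, True, True])  # instrument covers last two
print(update_mesh(prev, feats, mask))  # [0.7 0.7 0.  0. ]
```

In a full system the mask would come from the per-frame fit of the modeled instrument shape, and the update would feed back into the energy minimization above.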

Real-time tracking is one of the most common tasks in computer vision. In controlled environments such tasks are performed with high accuracy, but real-life data, especially images from noisy or insufficiently lit endoscopic cameras, pose serious challenges. The experienced team of engineers at RSIP Vision has been addressing such challenges successfully in many medical applications. RSIP Vision has been operating in the field of computer vision for over 25 years. To learn more about how RSIP Vision’s algorithms can be used in robotic surgery and other medical applications, visit our projects page and read more articles in Computer Vision News.


Main Field

Surgical

In recent years, robotic modules and their associated algorithmic software have entered operating rooms; they are used in orthopedic, brain, and cardiac surgeries. Some of their applications are reviewed here.


Categories

  • RSIP Vision Learns, Surgical

Related Content

  • How Can AI Assist in Single-port Robotic Surgery?
  • Data Generation in Robotic Assisted Surgeries (RAS)
  • Success Rating in Robotic Assisted Surgeries
  • Tissue Sparing in Robotic Assisted Orthopedic Surgeries
  • Radiation Reduction in Robotic Assisted Surgeries (RAS) Using AI
  • Hyperspectral Imaging for Robotic Assisted Surgery



© All rights reserved to RSIP Vision 2023
