Bay Vision - Spring 2018
“As we get more and more data, we are looking towards moving to deep learning. In general, I would say most of the heavy lifting happens in pre-processing: segmenting the sponge from the background and the blood from the sponge, dealing with background effects, and dealing with differences in lighting.”

The whole process of scanning a sponge starts with someone holding it up and the system recognizing that there is a sponge and segmenting it out. Kevin says they struggled to recognize a sponge from color alone: the sponges are red and white, while walls are often white, gloves are white, hazard bags are red, and so on, so the white of a sponge could blend into the white of some other object. A depth sensor has helped with this, so that is what they now use to recognize the sponge and get an extremely accurate segmentation of it. Now that Apple has released the iPhone X with its TrueDepth sensor, they are hoping it will at some point be added to the iPad as well.

Drew adds: “The fantastic thing about this is that computer vision scientists very often have the chance to really control what camera they are using, whereas it is very difficult, and extremely expensive, for us to manufacture medical equipment with specialized cameras. What Kevin has done is figure out how to do all of this, the ambient light conditioning and everything, using just the standard front camera on an iPad.”

What if the same sponge is shown to the camera twice by mistake? Kevin says they are working on improving the accuracy of the product using pattern recognition, because it is really important that they can account for every sponge. Believe it or not, objects are sometimes left inside patients; in medical terms this is called a retained foreign object, and it can obviously pose a big risk to the patient.

If they could add more features, Kevin tells us he would like to be able to estimate the volume of clots.
He thinks it might be possible, as they already have a 3D sensor. As a new product, he would also like to estimate hemolysis directly. He explains: “Sometimes red blood cells will pop, they will lyse, which basically makes them not useful. There are companies that try to salvage red blood cells from lost blood and put them back into patients, so it helps to know how many of those cells have lysed. It turns out there are some visual differences between hemolyzed and non-hemolyzed blood, because intact blood cells will scatter light, whereas once they have lysed they will not, so that would be a really cool product to have.”

Finally, Drew tells us that they have had some great feedback from the patients themselves: “We have had some pretty cool discussions where patients have actually tweeted about us. If you look at our Twitter feed as well as our website, you will see different situations where patients have reached out to us and thanked us for coming up with the technology.”

5 Gauss Surgical Bay Vision
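For technically minded readers, the segmentation approach Kevin describes, using depth to disambiguate a red-and-white sponge from white walls, white gloves, and red hazard bags, can be illustrated with a small sketch. Everything here (the thresholds, the depth window, the blood-fraction heuristic) is a hypothetical illustration, not Gauss Surgical's actual pipeline:

```python
import numpy as np

def segment_sponge(rgb, depth, near=0.3, far=0.8):
    """Toy depth-assisted sponge segmentation.

    rgb:   H x W x 3 uint8 image.
    depth: H x W depth map in metres.
    Returns (sponge_mask, blood_mask, blood_fraction).
    """
    # Depth gate: a held-up sponge sits much closer to the camera than
    # walls, hazard bags, or other staff, so color lookalikes in the
    # background are rejected outright.
    depth_mask = (depth > near) & (depth < far)

    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)

    # Blood-soaked pixels: strongly red relative to green and blue.
    # (Thresholds are made up for illustration.)
    red_mask = (r > 120) & (r > 1.4 * g) & (r > 1.4 * b)
    # Clean gauze pixels: bright and roughly neutral (white).
    white_mask = (r > 180) & (g > 180) & (b > 180)

    sponge_mask = depth_mask & (red_mask | white_mask)
    blood_mask = depth_mask & red_mask
    blood_fraction = blood_mask.sum() / max(sponge_mask.sum(), 1)
    return sponge_mask, blood_mask, blood_fraction
```

In this toy version, a white wall pixel passes the color test but is rejected by the depth gate, which is exactly the sponge-versus-background confusion that the depth sensor resolves.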