ICCV Daily 2021 - Thursday

Oral Presentation
Equivariant Imaging: Learning Beyond the Range Space

Dongdong Chen is a postdoc research associate at the University of Edinburgh. His work explores learning high-quality data from low-quality input. It has been accepted as an oral presentation this year, and he speaks to us ahead of his live Q&A session today.

This work seeks to recover high-quality ground truth images from low-quality measurement data. The novelty is that it solves this problem without using any ground truth data. “We’re doing unsupervised learning,” Dongdong tells us. “We exploit very mild prior knowledge about the signal set: basically, some symmetries and low dimensionality. We find these simple, mild priors can make your system work very well, and you can learn the ground truth from only the measurement data.”

In most imaging problems, there is no ground truth available for training. Previous techniques have collected labeled data, but this is very expensive, especially in medical imaging, where it often involves asking patients to be scanned again and again. This paper seeks to solve that challenge. “We can train a neural network without using any ground truth data,” Dongdong explains. “We use only the measurement data. That’s one of the biggest advantages of this paper. Another is that it’s totally end to end. We don’t need iterative optimization, we don’t have many hyperparameters to play with, and it’s very easy to implement.”

Dongdong points out that the most challenging part of this work was analyzing the prior, or the structure of the data. It is important to have clear prior knowledge of the signal set: without ground truth, it is very difficult to know what properties the signal set should have. For medical images, for example, the ground truth could be invariant to rotations or shifts. Another challenge is how to make reasonable transformations with which to learn such a reconstruction function. This technique requires very carefully selected transformations.
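The symmetry prior Dongdong describes can be sketched numerically. The snippet below is a minimal illustrative sketch, not the authors' implementation: the linear pseudo-inverse "network" `f`, the circular-shift group action, and the unweighted sum of losses are all assumptions made here for clarity. The key idea it illustrates is that a measurement-consistency loss alone cannot constrain the reconstruction outside the range space of the forward operator, while an equivariance loss (shift the estimate, re-measure, reconstruct, and compare) supplies a training signal beyond that range space.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 8                              # signal dimension (illustrative)
m = 4                              # number of measurements, m < n: incomplete operator
A = rng.standard_normal((m, n))    # known forward (measurement) operator

def f(y):
    """Toy stand-in for a reconstruction network: the pseudo-inverse of A."""
    return np.linalg.pinv(A) @ y

def shift(x, k=1):
    """A group action the signal set is assumed invariant to (circular shift)."""
    return np.roll(x, k)

y = A @ rng.standard_normal(n)     # a measurement; no ground truth available

x_hat = f(y)                       # reconstruct from the measurement alone

# 1) Measurement-consistency loss: re-measuring x_hat should reproduce y.
loss_mc = np.sum((A @ x_hat - y) ** 2)

# 2) Equivariance loss: shifting x_hat, re-measuring, and reconstructing
#    should return the shifted estimate. This term penalizes errors the
#    measurement-consistency loss cannot see (the null space of A).
x_shifted = shift(x_hat)
loss_eq = np.sum((f(A @ x_shifted) - x_shifted) ** 2)

loss = loss_mc + loss_eq           # combined training objective (weights assumed)
```

With this toy linear `f`, the measurement-consistency term is already near zero, yet the equivariance term is not: the pseudo-inverse only recovers the row-space component of the shifted estimate, which is exactly the gap a trained network would learn to close.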
